Preference Feature Extraction Based on Column Union Row Matrix Decomposition
LEI Hengxin, LIU Jinglei
School of Computer and Control Engineering, Yantai University, Yantai 264005
Abstract Preference features cannot be accurately analyzed and interpreted by singular value decomposition. To address this problem, a column union row (CUR) matrix decomposition method is proposed to obtain a low-rank approximation of the original matrix M (user preferences for products) and to extract the latent preferences of users and products. First, the statistical leverage scores of matrix M are computed. Then, several rows and columns with the highest scores are extracted to form the low-dimensional matrices C and R. Finally, the matrix U is constructed approximately from matrices M, C and R. By the proposed method, the problem of extracting preference features in a high-dimensional space is transformed into a matrix analysis problem in a low-dimensional space, so the CUR decomposition achieves better accuracy and interpretability. Theoretical analysis and experiments indicate that, compared with traditional decomposition methods, the CUR matrix decomposition method provides higher accuracy, better interpretability and a higher compression ratio for preference feature extraction.
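The pipeline described in the abstract (leverage scores, then sampling of high-score rows and columns, then construction of U from M, C and R) can be sketched as below. This is a minimal illustration of a standard leverage-score CUR scheme, not the authors' exact algorithm; the function name, the rank parameter k, and the choice to sample without replacement are assumptions made for the example.

```python
import numpy as np

def cur_decomposition(M, k, c, r, seed=None):
    """Illustrative CUR sketch: sample c columns and r rows of M by
    statistical leverage scores, then set U = pinv(C) @ M @ pinv(R)
    so that C @ U @ R approximates M."""
    rng = np.random.default_rng(seed)

    # Leverage scores are derived from the top-k singular vectors of M.
    U_svd, _, Vt = np.linalg.svd(M, full_matrices=False)
    col_scores = np.sum(Vt[:k, :] ** 2, axis=0) / k    # one score per column
    row_scores = np.sum(U_svd[:, :k] ** 2, axis=1) / k  # one score per row

    # Sample columns and rows with probability proportional to their scores
    # (without replacement here, for simplicity).
    cols = rng.choice(M.shape[1], size=c, replace=False,
                      p=col_scores / col_scores.sum())
    rows = rng.choice(M.shape[0], size=r, replace=False,
                      p=row_scores / row_scores.sum())
    C = M[:, cols]   # actual columns of M: interpretable as real products
    R = M[rows, :]   # actual rows of M: interpretable as real users

    # U is built approximately from M, C and R via pseudo-inverses.
    U = np.linalg.pinv(C) @ M @ np.linalg.pinv(R)
    return C, U, R
```

Because C and R consist of actual columns and rows of the preference matrix, they correspond to real products and users, which is the source of the interpretability advantage over SVD factors.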
Received: 13 May 2016
Fund: Supported by National Natural Science Foundation of China (No. 61572419, 61572418, 61403328, 61403329) and Natural Science Foundation of Shandong Province (No. 2015GSF115009, ZR2014FQ016, ZR2014FQ026, ZR2013FM011)
About author: LEI Hengxin, born in 1993, master student. His research interests include matrix decomposition and its application in recommendation systems, and preference feature extraction. LIU Jinglei (corresponding author), born in 1970, master, associate professor. His research interests include artificial intelligence and theoretical computer science.